Introducing our 0.9 Process

Rex Hygate
Published in DeFi Safety
8 min read · Sep 8, 2023

1. Overview

We are updating our protocol review process from 0.8 to 0.9. This article describes the changes and contains all of the new questions with their guidance. We will implement these changes on our website and start using the new process in about a month. Now that we have finalized the process, we want to share it with our users. Comments are welcome.

1.1. Simple summary of changes

We added questions on active monitoring (on-chain and on the website), as monitoring services are now available and have contributed to safety.

We revamped the access control section, making it simpler and clearer. It remains very important from a score perspective. We now want explicit descriptions of all signers for crucial multisigs, including evidence that they are provably distinct humans.

We now ask for a transaction signing policy: a new document that describes how a protocol's signers actually sign safely for protocol changes. We will publish a standalone article on this shortly.

We added a request for a new “Audit Applicability” document that simply lists the audits applicable to the deployed code. Euler was an example where it was unclear to everyone whether the attacked code had actually been audited.

We reduced the scoring weight of documentation and increased that of testing. Testing is very important, more important than documentation, and AI has made automated documentation possible, meaning great documentation might not increase safety as much as it previously did.

Finally, if oracles are not applicable to a protocol, the oracle questions are ignored.

2. 0.9 Questions

This section includes the content, guidance and scoring for all questions in 0.9.

2.1. Code And Team

This section looks at the code deployed on the relevant chains and team aspects. The document explaining these questions is here.

1. Are the smart contract addresses easy to find? (%)

Guidance same as 0.8
Percentage Score Guidance:

100% Clearly labelled and on website, docs or repository; quick to find
70% Clearly labelled and on website, docs or repo, but takes a bit of looking
40% Addresses in mainnet.json, in Discord, in a subgraph, etc.
20% Addresses found but labelling is not clear or easy to find
0% Executing addresses could not be found
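As a purely illustrative sketch (every contract name and address below is invented), a listing that would score 100% keeps each executing address clearly labelled per chain in one obvious place:

```python
# Hypothetical example of a clearly labelled contract-address listing,
# as might appear in a protocol's docs or a deployments file.
# All names and addresses are invented for illustration only.
DEPLOYED_CONTRACTS = {
    "ethereum-mainnet": {
        "Vault": "0x0000000000000000000000000000000000000001",
        "Governance": "0x0000000000000000000000000000000000000002",
        "PriceFeedAdapter": "0x0000000000000000000000000000000000000003",
    },
}

def lookup(chain: str, name: str) -> str:
    """Return the deployed address for a named contract on a given chain."""
    return DEPLOYED_CONTRACTS[chain][name]
```

A reviewer (or user) should be able to go from a contract name to its executing address this quickly; anything requiring Discord archaeology scores lower.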

2. Does the protocol have a public software repository? (Y/N)
Score Guidance:

Yes There is a public software repository with, at a minimum, the code, but normally also tests and scripts. This counts even if the repository was created just to hold the files and has only one commit.

No For teams with private repositories.

3. Is the team public (not anonymous)? (%)

Guidance same as 0.8 Question 5
Percentage Score Guidance:
100% At least two names can be easily found on the protocol’s website, documentation or Medium. These are then confirmed via the personal websites of the individuals / their LinkedIn / Twitter.
50% At least one public name can be found to be working on the protocol.
0% No public team members could be found.

4. How responsive are the devs when we present our initial report? (%)

Percentage Score Guidance:
100% Devs responded within 24 hours
75% Devs responded within 48 hours
50% Devs responded within 72 hours
50% Data not entered yet
0% No dev response within 72 hours

2.2. Documentation

This section looks at the software documentation. The document explaining these questions is here.

5. Is there a whitepaper? (Y/N)

Score Guidance:

Yes There is an actual whitepaper or at least a very detailed doc on the technical basis of the protocol

No No whitepaper. A simple GitBook description of the protocol is not sufficient

6. Is the protocol’s software architecture documented? (%)

Percentage Score Guidance:
100% Detailed software architecture diagram with explanation
75% Basic block diagram of software aspects
0% No software architecture documentation

7. Does the software documentation fully cover the deployed contracts’ source code? (%)

Same as 0.8 Question 8
Percentage Score Guidance:
100% All contracts and functions documented
80% Only the major functions documented
79–1% Estimate of the level of software documentation
0% No software documentation

8. Is it possible to trace the documented software to its implementation in the protocol’s source code? (%)

Percentage Score Guidance:
100% Requirements with traceability to code and to tests (as in avionics DO-178)
90% Formal requirements with some traceability
80% Good auto-generated docs (e.g. https://developers.morpho.xyz/)
60% Clear association between code and documents, but without explicit traceability
40% Documentation lists all the functions and describes what they do
0% No connection between documentation and code

9. Is the documentation organized to ensure information availability and clarity? (%)

Percentage Score Guidance:
100% Information is well organized, compartmentalized and easy to navigate
50% Information is decently organized but could use some streamlining
0% Information is generally obfuscated

2.3. Testing

10. Has the protocol tested their deployed code? (%)

Same as 0.8 Question 10

Percentage Score Guidance:
100% TtC > 120%; both unit and system tests visible
80% TtC > 80%; both unit and system tests visible
40% TtC < 80%; some tests visible
0% No tests obvious

TtC is the Test-to-Code ratio: the volume of test code relative to the volume of deployed source code.

11. How covered is the protocol’s code? (%)

Same as 0.8 Question 11

Percentage Score Guidance:
100% Documented full coverage
99–51% Value of test coverage from documented results
50% No indication of code coverage but clearly there is a complete set of tests
30% Some tests evident but not complete
0% No test for coverage seen

12. Is there a detailed report of the protocol’s test results? (%)

Same as 0.8 Question 13

Percentage Score Guidance:
100% Detailed test report
70% GitHub code coverage report visible
0% No test report evident

13. Has the protocol undergone Formal Verification? (Y/N)

Same as 0.8 Question 14

Score Guidance:
Yes Formal Verification was performed and the report is readily available
No Formal Verification was not performed and/or the report is not readily available.

2.4. Security

14. Is the protocol sufficiently audited? (%)

Same as 0.8 Question 16

Percentage Score Guidance:
100% Multiple Audits performed before deployment and the audit findings are public and implemented or not required
90% Single audit performed before deployment and audit findings are public and implemented or not required
70% Audit(s) performed after deployment and no changes required. The Audit report is public.
65% Code is forked from an already audited protocol and a changelog is provided explaining why forked code was used and what changes were made. This changelog must justify why the changes made do not affect the audit.
50% Audit(s) performed after deployment and changes are needed but not implemented.
30% Audit(s) performed are low-quality and do not indicate proper due diligence.
20% No audit performed
0% Audit performed after deployment, existence is public, report is not public OR smart contract addresses not found.

Deduct 25% if the audited code is not available for comparison.

15. Is there a matrix of audit applicability on deployed code (%)? Please refer to the example doc for reference.

Percentage Score Guidance:
100% Current and clear matrix of applicability
50% Out-of-date matrix of applicability
0% No matrix of applicability
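Conceptually, an audit-applicability matrix is just a mapping from each deployed contract to the audits that cover it. A hypothetical sketch (contract and audit names invented) showing how such a matrix immediately exposes the Euler-style gap of deployed-but-unaudited code:

```python
# Hypothetical audit-applicability matrix:
# deployed contract -> list of audits covering it.
AUDIT_MATRIX = {
    "Vault.sol":            ["AuditFirmA-2023-01"],
    "Governance.sol":       ["AuditFirmA-2023-01", "AuditFirmB-2023-06"],
    "PriceFeedAdapter.sol": [],  # deployed, but no applicable audit
}

def unaudited_contracts(matrix: dict) -> list:
    """Deployed contracts with no applicable audit: the gap the matrix exposes."""
    return [name for name, audits in matrix.items() if not audits]
```

Kept current, the matrix answers at a glance whether attacked code was in audit scope, rather than leaving everyone to reconstruct that after an incident.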

16. Is the bounty value acceptably high? (%)

Same as 0.8 Question 17

Percentage Score Guidance:

100% Bounty is 10% of TVL or at least $1M, AND active program (see below)
90% Bounty is 5% of TVL or at least $500k, AND active program
80% Bounty is 5% of TVL or at least $500k
70% Bounty is $100k or over, AND active program
60% Bounty is $100k or over
50% Bounty is $50k or over, AND active program
40% Bounty is $50k or over
20% Bug bounty program bounty is less than $50k
0% No bug bounty program offered / the bug bounty program is dead

An active program means that a third party (such as Immunefi) is actively driving hackers to the site. An inactive program would be static mentions in the docs.
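The bounty tiers can be expressed as a simple lookup. A sketch, assuming dollar amounts for both the bounty and TVL (the function name and structure are ours, not part of the process document):

```python
def bounty_score(bounty_usd: float, tvl_usd: float, active: bool) -> int:
    """Map bug-bounty size and program activity to the 0.9 guidance tiers.

    'active' means a third party (e.g. Immunefi) actively drives
    hackers to the program, rather than static mentions in the docs.
    """
    if bounty_usd <= 0:
        return 0  # no bug bounty program offered
    if active and (bounty_usd >= 0.10 * tvl_usd or bounty_usd >= 1_000_000):
        return 100
    if bounty_usd >= 0.05 * tvl_usd or bounty_usd >= 500_000:
        return 90 if active else 80
    if bounty_usd >= 100_000:
        return 70 if active else 60
    if bounty_usd >= 50_000:
        return 50 if active else 40
    return 20  # program exists but bounty is under $50k
```

Note that a large bounty without an active program tops out at 80%: the tiers reward outreach, not just the headline number.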

17. Is there documented protocol monitoring (%)?

Percentage Score Guidance:
80% Documentation covering protocol-specific threat monitoring
60% Documentation covering generic threat monitoring with incident response
40% Documentation covering operational monitoring with incident response
0% No on-chain monitoring

Add 20% for a documented incident response process.

18. Is there documented web site front end monitoring (%)?

Percentage Score Guidance:
25% for each of the elements documented. Documentation does need to be specific (for improved security).

1) DDOS Protection
2) DNS steps to protect the domain
3) Intrusion detection protection on the front end
4) Unwanted front-end modification detection

2.5. Access Controls

19. Is the protocol code immutable or upgradeable? (%)

Percentage Score Guidance:
100% Fully immutable
80% Upgradeable with timelock > 1 week
50% Upgradeable code with roles
30% Upgradeable code with multisig
0% Upgradeable code via EOA

A pause control does not impact immutability.

20. Is the protocol’s code upgradeability clearly explained in non-technical terms? (%)

Percentage Score Guidance:

100% Code is immutable and this is clearly indicated in the documentation
100% Code is upgradeable and clearly explained in non-technical terms
50% Code is upgradeable with minimal explanation
50% Code is immutable but this is not mentioned clearly in the documentation
0% No documentation on code upgradeability

21. Are the admin addresses, roles and capabilities clearly explained? (%)

Percentage Score Guidance:
100% Admin addresses, roles and capabilities clearly explained
80% Admin addresses, roles and capabilities incompletely explained but good content
40% Admin addresses, roles and capabilities minimally explained, information scattered
0% No information on admin addresses, roles and capabilities

22. Are the signers of the admin addresses clearly listed and provably distinct humans? (%)

Percentage Score Guidance:
100% All signers of the admin addresses are clearly listed and provably distinct humans
60% All signers of the admin addresses are clearly listed
30% Some signers of the admin addresses are listed
0% No documentation on the admin addresses

23. Is there a robust documented transaction signing policy? (%) Please refer to the example doc for reference.

Percentage Score Guidance:
80% Robust transaction signing process (7 or more elements)
70% Adequate transaction signing process (5 or more elements)
60% Weak transaction signing process (3 or more elements)
0% No transaction signing process evident

Evidence of audits showing that signers follow the process adds 20%.
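The element counts and the audit bonus combine into a score like so. A sketch under our own naming (the process document defines the tiers, not this function):

```python
def signing_policy_score(num_elements: int, signer_audits: bool) -> int:
    """Map a documented transaction signing policy to the 0.9 guidance tiers.

    num_elements: how many distinct elements the signing policy documents.
    signer_audits: whether there is evidence of audits showing signers
    actually follow the documented process (worth +20%).
    """
    if num_elements >= 7:
        base = 80      # robust process
    elif num_elements >= 5:
        base = 70      # adequate process
    elif num_elements >= 3:
        base = 60      # weak process
    else:
        return 0       # no signing process evident
    return min(base + (20 if signer_audits else 0), 100)
```

So a robust seven-element policy with audited signers reaches 100%, while even a strong policy with no evidence anyone follows it caps at 80%.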

2.6. Oracles

24. Are Oracles relevant? (Y/N)

Yes The protocol uses oracles and the next two questions are relevant
No If the protocol does not use oracles, the answer is No and the oracle questions will not be answered or used in the final score for this protocol

25. Is the protocol’s Oracle sufficiently documented? (%)

Percentage Score Guidance:

100% The Oracle is specified. The contracts dependent on the oracle are identified. Basic software functions are identified (if the protocol provides its own price feed data). Timeframe of price feeds are identified.
75% The Oracle documentation identifies both source and timeframe but does not provide additional context regarding smart contracts.
50% Only the Oracle source is identified.
0% No oracle is named / no oracle information is documented.

26. Can flash loan attacks be applied to the protocol, and if so, are those flash loan attack risks mitigated? (Y/N)

Yes The protocol’s documentation includes information on how they mitigate the possibilities and extents of flash loan attacks.

No The protocol’s documentation does not include any information regarding the mitigation of flash loan attacks.
